Tutorial - Asma Siddiki - L&C VII - sentence processing

Greg Detre

@12 on 22/11/00

 

Presentation

grammar as conventions that allow us to get much more out of words and their combinations

the PoS roles - case-marking/word-order

so you could see "understanding a sentence" as understanding the role played by the individual words, and the relationships between them as modulated by the other little words

 

however, this syntactic processing is bound up with semantic processing

e.g. in resolving ambiguities, where the parse tree could branch in different ways or at different places

however, we get the input word-by-word, and we don't wait for it all before going to work

hence "garden path" sentences, where we guess, wrongly

what affects how we guess?

preference for particular grammatical structures?

least "complex" grammar?

most frequently occurring (= simplest?) structures

context (…)

constraint-satisfaction of all these factors - if one is missing, rely on the others (e.g. old experiments without context)
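A toy sketch of this constraint-satisfaction idea (entirely illustrative - the cue names, votes, and weights are made up, not any published model): each available cue votes for a parse of the classic garden-path fragment "The horse raced past the barn …", and if one cue is missing the others decide.

```python
def score(parse, cues, weights):
    """Sum the weighted evidence the available cues give this parse."""
    return sum(weights[name] * votes[parse] for name, votes in cues.items())

def choose(cues, weights):
    """Pick the parse with the highest combined cue score."""
    return max(["main_verb", "reduced_relative"],
               key=lambda p: score(p, cues, weights))

# Hypothetical cue "votes" (made-up numbers, for illustration only):
frequency = {"main_verb": 0.9, "reduced_relative": 0.1}  # main-verb reading is more frequent
context   = {"main_verb": 0.2, "reduced_relative": 0.8}  # a biasing context favours the relative clause
weights   = {"frequency": 1.0, "context": 2.0}           # assumed weights

with_context    = choose({"frequency": frequency, "context": context}, weights)
without_context = choose({"frequency": frequency}, weights)  # no context: frequency alone decides
```

With the context cue present the reduced-relative parse wins; remove it and frequency alone steers us down the garden path - matching the observation that the old context-free experiments relied on the remaining cues.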

 

spoken vs written: writing lacks prosody, which adds meaning or disambiguates (Tanenhaus timing experiment)

 

mental representations

EC (empty category) = a gap which can be filled by a trace

trace = a silent (unpronounced) category

 

Chomsky and deep/surface structure - inflection + case-marking vs position

locality - the EC is not local to the word; the filler needs to be kept in memory

 

generative grammar - produce novel sentences using finite rules/input etc.

competence vs performance

universal grammar = the blueprint of innateness, common to all human languages

parameters differ, but share deep structure

levels: Deep + Surface structure

surface structure = what you hear

e.g. J is eager to please (actor)

J is easy to please (it is easy to please J)

"wanna" contractions

which team do you want to/wanna beat?

linguistics is descriptive

importance of native speaker

which team do you want ___ to win?

supposedly you cannot contract to "wanna" here because there's a gap (the trace) between "want" and "to"

who do you think ___ will win the game?

but you can contract to "think'll" - against Chomsky

 

Derivational theory of complexity

the least complex derivation of a sentence is the most accessible/easiest to interpret

easy transformation

minimal attachment - minimise the number of phrase nodes in the parse

active sentences processed faster than passive - yes

positive processed faster than negative - yes
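The minimal-attachment preference above can be illustrated with a toy node count over two hand-built parse trees for a classic PP-attachment ambiguity (the sentence, bracketing, and tree shapes are illustrative assumptions, not a real grammar's output):

```python
def count_nodes(tree):
    """Count phrase nodes in a parse tree given as nested tuples; words are strings."""
    if isinstance(tree, str):  # a word (or label) is not a phrase node
        return 0
    return 1 + sum(count_nodes(child) for child in tree)

# "The spy saw the cop with binoculars":
# attach the PP to the VP (fewer nodes) or inside the object NP (one extra NP node)
vp_attach = ("S", ("NP", "the spy"),
                  ("VP", "saw", ("NP", "the cop"),
                         ("PP", "with binoculars")))
np_attach = ("S", ("NP", "the spy"),
                  ("VP", "saw", ("NP", ("NP", "the cop"),
                                       ("PP", "with binoculars"))))

# minimal attachment: prefer the parse with fewer phrase nodes
preferred = min([vp_attach, np_attach], key=count_nodes)
```

Here the VP-attachment tree has 5 phrase nodes and the NP-attachment tree 6, so the count-minimising parser initially takes the VP reading - the kind of structural preference the derivational-complexity line of work was probing.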

 

Slobin - implausibility affects the ambiguity and reading

implausible sentences showed all the same effects, so semantics had to be involved

 

Rapid Serial Visual Presentation + following eye movements

eyes go back (regress) when reading garden-path sentences

but this problem goes away if you provide context

 

Gain(???) + the multiple elephants and relative clauses

Grice - don't add irrelevant info

 

2) Separate processing of syntax

usually talk of modularity on a larger scale

which do we process first?

Fodothal(???) - both processed at the end of each clause

Marslen-Wilson - both processed together ???

 

2 main models

serial - Forster (autonomous) - process words, then syntax

no information flows back, yet syntax does affect word recognition

explained if there's a 2-way connection with the mental lexicon + the GPS

interactive - Marslen-Wilson - the different modules interact + swap information with each other

Swinney - we access both meanings of an ambiguous word

showed that it primes both meanings, until you reach the end of the sentence

Rayner - eyes go back to the main verb with ambiguity, unless the semantics cleared things up

discounted full interactive/completely parallel models

ERPs - different reactions to semantic vs syntactic implausibility

Gibson 98 - TiCS 2, 7 - constraints on sentence comprehension

 

lexical

contextual - visiting relatives are/is fun

computational resources - locality

frequency - phrase-level

 

Questions

believe (EC) / believe that… (???)